[ Wed Sep 28 02:22:55 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:23:09 2022 ] Parameters:
{
  'work_dir': 'work_dir/ntu60/cview/fc_vel',
  'model_saved_name': 'work_dir/ntu60/cview/fc_vel/runs',
  'config': 'config/nturgbd-cross-view/fc_vel.yaml',
  'phase': 'train',
  'save_score': False,
  'joint_label': [],
  'seed': 1,
  'log_interval': 100,
  'save_interval': 1,
  'save_epoch': 35,
  'eval_interval': 5,
  'ema': False,
  'print_log': True,
  'show_topk': [1, 5],
  'feeder': 'feeders.feeder_ntu.Feeder',
  'num_worker': 48,
  'train_feeder_args': {
    'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'train', 'debug': False,
    'random_choose': False, 'random_shift': False, 'random_move': False,
    'window_size': 64, 'normalization': False, 'random_rot': True,
    'p_interval': [0.5, 1], 'vel': True, 'bone': False
  },
  'test_feeder_args': {
    'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'test', 'window_size': 64,
    'p_interval': [0.95], 'vel': True, 'bone': False, 'debug': False
  },
  'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model',
  'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
  'weights': None,
  'ignore_weights': [],
  'base_lr': 0.1,
  'step': [90, 100],
  'device': [6],
  'optimizer': 'SGD',
  'nesterov': True,
  'momentum': 0.9,
  'batch_size': 64,
  'test_batch_size': 64,
  'start_epoch': 0,
  'num_epoch': 110,
  'weight_decay': 0.0004,
  'lr_decay_rate': 0.1,
  'warm_up_epoch': 5
}

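Both feeder configs above set 'vel': True, i.e. the joint-coordinate stream is converted to frame-to-frame motion before being fed to the model. A minimal sketch of that velocity transform, assuming the usual (C, T, V, M) skeleton layout (channels, frames, joints, persons); the repo's feeder may implement the padding differently (e.g. zeroing the first frame instead of the last):

```python
import numpy as np

def to_velocity(x):
    """Turn joint coordinates of shape (C, T, V, M) into frame-wise motion:
    v[:, t] = x[:, t + 1] - x[:, t], with the last frame padded with zeros."""
    v = np.zeros_like(x)
    v[:, :-1] = x[:, 1:] - x[:, :-1]  # difference along the time axis (T)
    return v
```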
[ Wed Sep 28 02:23:09 2022 ] # Parameters: 2082097
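The schedule keys in the config (base_lr: 0.1, warm_up_epoch: 5, step: [90, 100], lr_decay_rate: 0.1) describe the learning-rate policy behind the "using warm up, epoch: 5" banner that opens this log. A minimal sketch of that policy, assuming a linear warm-up ramp followed by multi-step decay; the exact ramp in the training code may differ:

```python
def get_lr(epoch, base_lr=0.1, warm_up_epoch=5, step=(90, 100), lr_decay_rate=0.1):
    """Linear warm-up over the first `warm_up_epoch` epochs, then multiply
    the rate by `lr_decay_rate` at each milestone epoch in `step`."""
    if epoch < warm_up_epoch:
        # ramp linearly from base_lr / warm_up_epoch up to base_lr
        return base_lr * (epoch + 1) / warm_up_epoch
    # decay once per milestone already passed
    n_decays = sum(epoch >= s for s in step)
    return base_lr * (lr_decay_rate ** n_decays)
```

With these defaults the rate climbs to 0.1 by epoch 5, drops to 0.01 at epoch 90 and to 0.001 at epoch 100, matching the 110-epoch run configured above.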
[ Wed Sep 28 02:23:09 2022 ] Training epoch: 1
[ Wed Sep 28 02:26:07 2022 ] 	Mean training loss: 2.6736. loss2: 0.0000. Mean training acc: 29.78%.
[ Wed Sep 28 02:26:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:26:07 2022 ] Eval epoch: 1
[ Wed Sep 28 02:26:41 2022 ] 	Mean test loss of 296 batches: 1.6905801670776832.
[ Wed Sep 28 02:26:41 2022 ] 	Top1: 50.96%
[ Wed Sep 28 02:26:41 2022 ] 	Top5: 85.44%
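The Top1/Top5 figures reported after each eval follow the standard top-k convention selected by 'show_topk': [1, 5]: a sample counts as correct if its true class appears among the k highest-scoring predictions. A minimal NumPy sketch of that metric (the scores and labels below are illustrative, not from this run):

```python
import numpy as np

def topk_accuracy(scores, labels, k=1):
    """Percentage of samples whose true label is among the k top-scoring classes.
    `scores` has shape (N, num_class); `labels` has shape (N,)."""
    topk = np.argsort(scores, axis=1)[:, -k:]    # indices of the k largest scores
    hit = (topk == labels[:, None]).any(axis=1)  # is the true label in the top k?
    return hit.mean() * 100.0                    # percentage, as printed in the log

scores = np.array([[0.1, 0.7, 0.2],
                   [0.5, 0.3, 0.2]])
labels = np.array([1, 1])
```

Here `topk_accuracy(scores, labels, 1)` is 50.0 (the second sample's top prediction is class 0) while `topk_accuracy(scores, labels, 2)` is 100.0.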
[ Wed Sep 28 02:26:41 2022 ] Training epoch: 2
[ Wed Sep 28 02:29:36 2022 ] 	Mean training loss: 1.7056. loss2: 0.0000. Mean training acc: 50.45%.
[ Wed Sep 28 02:29:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:29:36 2022 ] Eval epoch: 2
[ Wed Sep 28 02:30:09 2022 ] 	Mean test loss of 296 batches: 1.109945199578195.
[ Wed Sep 28 02:30:10 2022 ] 	Top1: 66.11%
[ Wed Sep 28 02:30:10 2022 ] 	Top5: 93.05%
[ Wed Sep 28 02:30:10 2022 ] Training epoch: 3
[ Wed Sep 28 02:33:05 2022 ] 	Mean training loss: 1.3539. loss2: 0.0000. Mean training acc: 59.56%.
[ Wed Sep 28 02:33:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:33:05 2022 ] Eval epoch: 3
[ Wed Sep 28 02:33:38 2022 ] 	Mean test loss of 296 batches: 1.0168447951610025.
[ Wed Sep 28 02:33:38 2022 ] 	Top1: 69.56%
[ Wed Sep 28 02:33:38 2022 ] 	Top5: 93.90%
[ Wed Sep 28 02:33:38 2022 ] Training epoch: 4
[ Wed Sep 28 02:36:34 2022 ] 	Mean training loss: 1.1837. loss2: 0.0000. Mean training acc: 64.29%.
[ Wed Sep 28 02:36:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:36:34 2022 ] Eval epoch: 4
[ Wed Sep 28 02:37:07 2022 ] 	Mean test loss of 296 batches: 0.8524233510566724.
[ Wed Sep 28 02:37:07 2022 ] 	Top1: 73.72%
[ Wed Sep 28 02:37:07 2022 ] 	Top5: 95.66%
[ Wed Sep 28 02:37:07 2022 ] Training epoch: 5
[ Wed Sep 28 02:40:03 2022 ] 	Mean training loss: 1.0894. loss2: 0.0000. Mean training acc: 66.96%.
[ Wed Sep 28 02:40:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:40:03 2022 ] Eval epoch: 5
[ Wed Sep 28 02:40:36 2022 ] 	Mean test loss of 296 batches: 0.8080102287836977.
[ Wed Sep 28 02:40:36 2022 ] 	Top1: 74.49%
[ Wed Sep 28 02:40:36 2022 ] 	Top5: 96.12%
[ Wed Sep 28 02:40:36 2022 ] Training epoch: 6
[ Wed Sep 28 02:43:32 2022 ] 	Mean training loss: 0.9610. loss2: 0.0000. Mean training acc: 70.39%.
[ Wed Sep 28 02:43:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:43:32 2022 ] Eval epoch: 6
[ Wed Sep 28 02:44:05 2022 ] 	Mean test loss of 296 batches: 0.8134848671789104.
[ Wed Sep 28 02:44:05 2022 ] 	Top1: 75.48%
[ Wed Sep 28 02:44:05 2022 ] 	Top5: 95.20%
[ Wed Sep 28 02:44:05 2022 ] Training epoch: 7
[ Wed Sep 28 02:47:01 2022 ] 	Mean training loss: 0.8861. loss2: 0.0000. Mean training acc: 72.57%.
[ Wed Sep 28 02:47:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:47:01 2022 ] Eval epoch: 7
[ Wed Sep 28 02:47:34 2022 ] 	Mean test loss of 296 batches: 0.7655173175641008.
[ Wed Sep 28 02:47:34 2022 ] 	Top1: 75.85%
[ Wed Sep 28 02:47:34 2022 ] 	Top5: 96.42%
[ Wed Sep 28 02:47:34 2022 ] Training epoch: 8
[ Wed Sep 28 02:50:29 2022 ] 	Mean training loss: 0.8451. loss2: 0.0000. Mean training acc: 73.86%.
[ Wed Sep 28 02:50:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:50:29 2022 ] Eval epoch: 8
[ Wed Sep 28 02:51:02 2022 ] 	Mean test loss of 296 batches: 0.8309828658160325.
[ Wed Sep 28 02:51:02 2022 ] 	Top1: 73.85%
[ Wed Sep 28 02:51:02 2022 ] 	Top5: 95.65%
[ Wed Sep 28 02:51:02 2022 ] Training epoch: 9
[ Wed Sep 28 02:53:58 2022 ] 	Mean training loss: 0.8221. loss2: 0.0000. Mean training acc: 74.56%.
[ Wed Sep 28 02:53:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:53:58 2022 ] Eval epoch: 9
[ Wed Sep 28 02:54:31 2022 ] 	Mean test loss of 296 batches: 0.7079419028517362.
[ Wed Sep 28 02:54:31 2022 ] 	Top1: 77.19%
[ Wed Sep 28 02:54:31 2022 ] 	Top5: 96.57%
[ Wed Sep 28 02:54:31 2022 ] Training epoch: 10
[ Wed Sep 28 02:57:26 2022 ] 	Mean training loss: 0.7898. loss2: 0.0000. Mean training acc: 75.53%.
[ Wed Sep 28 02:57:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:57:26 2022 ] Eval epoch: 10
[ Wed Sep 28 02:57:59 2022 ] 	Mean test loss of 296 batches: 0.6506384884988939.
[ Wed Sep 28 02:57:59 2022 ] 	Top1: 79.41%
[ Wed Sep 28 02:58:00 2022 ] 	Top5: 97.08%
[ Wed Sep 28 02:58:00 2022 ] Training epoch: 11
[ Wed Sep 28 03:00:55 2022 ] 	Mean training loss: 0.7730. loss2: 0.0000. Mean training acc: 76.09%.
[ Wed Sep 28 03:00:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:00:55 2022 ] Eval epoch: 11
[ Wed Sep 28 03:01:28 2022 ] 	Mean test loss of 296 batches: 0.7607838430920163.
[ Wed Sep 28 03:01:28 2022 ] 	Top1: 76.74%
[ Wed Sep 28 03:01:28 2022 ] 	Top5: 96.31%
[ Wed Sep 28 03:01:28 2022 ] Training epoch: 12
[ Wed Sep 28 03:04:24 2022 ] 	Mean training loss: 0.7617. loss2: 0.0000. Mean training acc: 76.19%.
[ Wed Sep 28 03:04:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:04:24 2022 ] Eval epoch: 12
[ Wed Sep 28 03:04:57 2022 ] 	Mean test loss of 296 batches: 0.7426936159262786.
[ Wed Sep 28 03:04:57 2022 ] 	Top1: 77.26%
[ Wed Sep 28 03:04:57 2022 ] 	Top5: 96.19%
[ Wed Sep 28 03:04:57 2022 ] Training epoch: 13
[ Wed Sep 28 03:07:53 2022 ] 	Mean training loss: 0.7476. loss2: 0.0000. Mean training acc: 76.75%.
[ Wed Sep 28 03:07:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:07:53 2022 ] Eval epoch: 13
[ Wed Sep 28 03:08:26 2022 ] 	Mean test loss of 296 batches: 0.6245662680248151.
[ Wed Sep 28 03:08:26 2022 ] 	Top1: 79.92%
[ Wed Sep 28 03:08:26 2022 ] 	Top5: 97.24%
[ Wed Sep 28 03:08:26 2022 ] Training epoch: 14
[ Wed Sep 28 03:11:21 2022 ] 	Mean training loss: 0.7331. loss2: 0.0000. Mean training acc: 77.19%.
[ Wed Sep 28 03:11:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:11:21 2022 ] Eval epoch: 14
[ Wed Sep 28 03:11:54 2022 ] 	Mean test loss of 296 batches: 0.9854030093631229.
[ Wed Sep 28 03:11:54 2022 ] 	Top1: 69.79%
[ Wed Sep 28 03:11:54 2022 ] 	Top5: 93.49%
[ Wed Sep 28 03:11:55 2022 ] Training epoch: 15
[ Wed Sep 28 03:14:50 2022 ] 	Mean training loss: 0.7161. loss2: 0.0000. Mean training acc: 77.48%.
[ Wed Sep 28 03:14:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:14:50 2022 ] Eval epoch: 15
[ Wed Sep 28 03:15:24 2022 ] 	Mean test loss of 296 batches: 0.7644872043684527.
[ Wed Sep 28 03:15:24 2022 ] 	Top1: 76.07%
[ Wed Sep 28 03:15:24 2022 ] 	Top5: 96.37%
[ Wed Sep 28 03:15:24 2022 ] Training epoch: 16
[ Wed Sep 28 03:18:20 2022 ] 	Mean training loss: 0.7046. loss2: 0.0000. Mean training acc: 78.15%.
[ Wed Sep 28 03:18:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:18:20 2022 ] Eval epoch: 16
[ Wed Sep 28 03:18:53 2022 ] 	Mean test loss of 296 batches: 0.6857172547764069.
[ Wed Sep 28 03:18:53 2022 ] 	Top1: 78.24%
[ Wed Sep 28 03:18:54 2022 ] 	Top5: 96.53%
[ Wed Sep 28 03:18:54 2022 ] Training epoch: 17
[ Wed Sep 28 03:21:50 2022 ] 	Mean training loss: 0.6962. loss2: 0.0000. Mean training acc: 78.28%.
[ Wed Sep 28 03:21:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:21:50 2022 ] Eval epoch: 17
[ Wed Sep 28 03:22:23 2022 ] 	Mean test loss of 296 batches: 0.5596368385327829.
[ Wed Sep 28 03:22:23 2022 ] 	Top1: 81.73%
[ Wed Sep 28 03:22:23 2022 ] 	Top5: 97.96%
[ Wed Sep 28 03:22:23 2022 ] Training epoch: 18
[ Wed Sep 28 03:25:19 2022 ] 	Mean training loss: 0.6878. loss2: 0.0000. Mean training acc: 78.39%.
[ Wed Sep 28 03:25:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:25:19 2022 ] Eval epoch: 18
[ Wed Sep 28 03:25:53 2022 ] 	Mean test loss of 296 batches: 0.6053371701288868.
[ Wed Sep 28 03:25:53 2022 ] 	Top1: 80.74%
[ Wed Sep 28 03:25:53 2022 ] 	Top5: 97.39%
[ Wed Sep 28 03:25:53 2022 ] Training epoch: 19
[ Wed Sep 28 03:28:48 2022 ] 	Mean training loss: 0.6840. loss2: 0.0000. Mean training acc: 78.78%.
[ Wed Sep 28 03:28:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:28:48 2022 ] Eval epoch: 19
[ Wed Sep 28 03:29:21 2022 ] 	Mean test loss of 296 batches: 0.6273819224757923.
[ Wed Sep 28 03:29:21 2022 ] 	Top1: 79.62%
[ Wed Sep 28 03:29:21 2022 ] 	Top5: 97.56%
[ Wed Sep 28 03:29:21 2022 ] Training epoch: 20
[ Wed Sep 28 03:32:17 2022 ] 	Mean training loss: 0.6714. loss2: 0.0000. Mean training acc: 79.01%.
[ Wed Sep 28 03:32:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:32:17 2022 ] Eval epoch: 20
[ Wed Sep 28 03:32:50 2022 ] 	Mean test loss of 296 batches: 0.6422397489281925.
[ Wed Sep 28 03:32:50 2022 ] 	Top1: 79.97%
[ Wed Sep 28 03:32:50 2022 ] 	Top5: 96.89%
[ Wed Sep 28 03:32:50 2022 ] Training epoch: 21
[ Wed Sep 28 03:35:45 2022 ] 	Mean training loss: 0.6612. loss2: 0.0000. Mean training acc: 79.28%.
[ Wed Sep 28 03:35:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:35:45 2022 ] Eval epoch: 21
[ Wed Sep 28 03:36:19 2022 ] 	Mean test loss of 296 batches: 0.6264290549867862.
[ Wed Sep 28 03:36:19 2022 ] 	Top1: 80.22%
[ Wed Sep 28 03:36:19 2022 ] 	Top5: 97.34%
[ Wed Sep 28 03:36:19 2022 ] Training epoch: 22
[ Wed Sep 28 03:39:14 2022 ] 	Mean training loss: 0.6646. loss2: 0.0000. Mean training acc: 79.28%.
[ Wed Sep 28 03:39:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:39:14 2022 ] Eval epoch: 22
[ Wed Sep 28 03:39:47 2022 ] 	Mean test loss of 296 batches: 0.563175900663073.
[ Wed Sep 28 03:39:47 2022 ] 	Top1: 82.25%
[ Wed Sep 28 03:39:47 2022 ] 	Top5: 97.86%
[ Wed Sep 28 03:39:47 2022 ] Training epoch: 23
[ Wed Sep 28 03:42:43 2022 ] 	Mean training loss: 0.6541. loss2: 0.0000. Mean training acc: 79.47%.
[ Wed Sep 28 03:42:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:42:43 2022 ] Eval epoch: 23
[ Wed Sep 28 03:43:16 2022 ] 	Mean test loss of 296 batches: 0.531403670037115.
[ Wed Sep 28 03:43:16 2022 ] 	Top1: 83.42%
[ Wed Sep 28 03:43:16 2022 ] 	Top5: 97.46%
[ Wed Sep 28 03:43:16 2022 ] Training epoch: 24
[ Wed Sep 28 03:46:12 2022 ] 	Mean training loss: 0.6470. loss2: 0.0000. Mean training acc: 79.69%.
[ Wed Sep 28 03:46:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:46:12 2022 ] Eval epoch: 24
[ Wed Sep 28 03:46:45 2022 ] 	Mean test loss of 296 batches: 0.5571309880831757.
[ Wed Sep 28 03:46:45 2022 ] 	Top1: 82.24%
[ Wed Sep 28 03:46:45 2022 ] 	Top5: 97.81%
[ Wed Sep 28 03:46:45 2022 ] Training epoch: 25
[ Wed Sep 28 03:49:40 2022 ] 	Mean training loss: 0.6517. loss2: 0.0000. Mean training acc: 79.44%.
[ Wed Sep 28 03:49:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:49:40 2022 ] Eval epoch: 25
[ Wed Sep 28 03:50:13 2022 ] 	Mean test loss of 296 batches: 0.5951738338414077.
[ Wed Sep 28 03:50:13 2022 ] 	Top1: 80.96%
[ Wed Sep 28 03:50:14 2022 ] 	Top5: 97.82%
[ Wed Sep 28 03:50:14 2022 ] Training epoch: 26
[ Wed Sep 28 03:53:09 2022 ] 	Mean training loss: 0.6354. loss2: 0.0000. Mean training acc: 80.10%.
[ Wed Sep 28 03:53:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:53:09 2022 ] Eval epoch: 26
[ Wed Sep 28 03:53:42 2022 ] 	Mean test loss of 296 batches: 0.5843936520049701.
[ Wed Sep 28 03:53:42 2022 ] 	Top1: 81.66%
[ Wed Sep 28 03:53:43 2022 ] 	Top5: 97.39%
[ Wed Sep 28 03:53:43 2022 ] Training epoch: 27
[ Wed Sep 28 03:56:38 2022 ] 	Mean training loss: 0.6345. loss2: 0.0000. Mean training acc: 80.34%.
[ Wed Sep 28 03:56:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:56:38 2022 ] Eval epoch: 27
[ Wed Sep 28 03:57:12 2022 ] 	Mean test loss of 296 batches: 0.5124533236832232.
[ Wed Sep 28 03:57:12 2022 ] 	Top1: 83.81%
[ Wed Sep 28 03:57:12 2022 ] 	Top5: 98.02%
[ Wed Sep 28 03:57:12 2022 ] Training epoch: 28
[ Wed Sep 28 04:00:07 2022 ] 	Mean training loss: 0.6335. loss2: 0.0000. Mean training acc: 80.17%.
[ Wed Sep 28 04:00:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:00:07 2022 ] Eval epoch: 28
[ Wed Sep 28 04:00:40 2022 ] 	Mean test loss of 296 batches: 0.6287512437918702.
[ Wed Sep 28 04:00:40 2022 ] 	Top1: 79.97%
[ Wed Sep 28 04:00:41 2022 ] 	Top5: 97.53%
[ Wed Sep 28 04:00:41 2022 ] Training epoch: 29
[ Wed Sep 28 04:03:36 2022 ] 	Mean training loss: 0.6344. loss2: 0.0000. Mean training acc: 80.06%.
[ Wed Sep 28 04:03:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:03:36 2022 ] Eval epoch: 29
[ Wed Sep 28 04:04:09 2022 ] 	Mean test loss of 296 batches: 0.6941754874345418.
[ Wed Sep 28 04:04:09 2022 ] 	Top1: 78.86%
[ Wed Sep 28 04:04:10 2022 ] 	Top5: 96.78%
[ Wed Sep 28 04:04:10 2022 ] Training epoch: 30
[ Wed Sep 28 04:07:06 2022 ] 	Mean training loss: 0.6322. loss2: 0.0000. Mean training acc: 80.36%.
[ Wed Sep 28 04:07:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:07:06 2022 ] Eval epoch: 30
[ Wed Sep 28 04:07:39 2022 ] 	Mean test loss of 296 batches: 0.6006671312088901.
[ Wed Sep 28 04:07:39 2022 ] 	Top1: 80.28%
[ Wed Sep 28 04:07:39 2022 ] 	Top5: 97.58%
[ Wed Sep 28 04:07:39 2022 ] Training epoch: 31
[ Wed Sep 28 04:10:34 2022 ] 	Mean training loss: 0.6297. loss2: 0.0000. Mean training acc: 80.32%.
[ Wed Sep 28 04:10:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:10:34 2022 ] Eval epoch: 31
[ Wed Sep 28 04:11:08 2022 ] 	Mean test loss of 296 batches: 0.5847188343570845.
[ Wed Sep 28 04:11:08 2022 ] 	Top1: 81.76%
[ Wed Sep 28 04:11:08 2022 ] 	Top5: 97.26%
[ Wed Sep 28 04:11:08 2022 ] Training epoch: 32
[ Wed Sep 28 04:14:03 2022 ] 	Mean training loss: 0.6117. loss2: 0.0000. Mean training acc: 80.75%.
[ Wed Sep 28 04:14:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:14:03 2022 ] Eval epoch: 32
[ Wed Sep 28 04:14:37 2022 ] 	Mean test loss of 296 batches: 0.5605013933334801.
[ Wed Sep 28 04:14:37 2022 ] 	Top1: 82.32%
[ Wed Sep 28 04:14:37 2022 ] 	Top5: 97.49%
[ Wed Sep 28 04:14:37 2022 ] Training epoch: 33
[ Wed Sep 28 04:17:32 2022 ] 	Mean training loss: 0.6207. loss2: 0.0000. Mean training acc: 80.55%.
[ Wed Sep 28 04:17:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:17:32 2022 ] Eval epoch: 33
[ Wed Sep 28 04:18:06 2022 ] 	Mean test loss of 296 batches: 0.6309277129334372.
[ Wed Sep 28 04:18:06 2022 ] 	Top1: 79.61%
[ Wed Sep 28 04:18:06 2022 ] 	Top5: 97.15%
[ Wed Sep 28 04:18:06 2022 ] Training epoch: 34
[ Wed Sep 28 04:21:02 2022 ] 	Mean training loss: 0.6145. loss2: 0.0000. Mean training acc: 81.08%.
[ Wed Sep 28 04:21:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:21:02 2022 ] Eval epoch: 34
[ Wed Sep 28 04:21:35 2022 ] 	Mean test loss of 296 batches: 0.6667536641093524.
[ Wed Sep 28 04:21:35 2022 ] 	Top1: 78.39%
[ Wed Sep 28 04:21:35 2022 ] 	Top5: 96.69%
[ Wed Sep 28 04:21:35 2022 ] Training epoch: 35
[ Wed Sep 28 04:24:31 2022 ] 	Mean training loss: 0.6120. loss2: 0.0000. Mean training acc: 80.95%.
[ Wed Sep 28 04:24:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:24:31 2022 ] Eval epoch: 35
[ Wed Sep 28 04:25:04 2022 ] 	Mean test loss of 296 batches: 0.697372947190259.
[ Wed Sep 28 04:25:04 2022 ] 	Top1: 78.11%
[ Wed Sep 28 04:25:04 2022 ] 	Top5: 96.87%
[ Wed Sep 28 04:25:04 2022 ] Training epoch: 36
[ Wed Sep 28 04:28:00 2022 ] 	Mean training loss: 0.6154. loss2: 0.0000. Mean training acc: 80.91%.
[ Wed Sep 28 04:28:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:28:00 2022 ] Eval epoch: 36
[ Wed Sep 28 04:28:34 2022 ] 	Mean test loss of 296 batches: 0.5837057062701599.
[ Wed Sep 28 04:28:34 2022 ] 	Top1: 81.83%
[ Wed Sep 28 04:28:34 2022 ] 	Top5: 97.39%
[ Wed Sep 28 04:28:34 2022 ] Training epoch: 37
[ Wed Sep 28 04:31:29 2022 ] 	Mean training loss: 0.6080. loss2: 0.0000. Mean training acc: 80.75%.
[ Wed Sep 28 04:31:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:31:29 2022 ] Eval epoch: 37
[ Wed Sep 28 04:32:02 2022 ] 	Mean test loss of 296 batches: 0.5394592389867112.
[ Wed Sep 28 04:32:02 2022 ] 	Top1: 82.65%
[ Wed Sep 28 04:32:02 2022 ] 	Top5: 98.11%
[ Wed Sep 28 04:32:02 2022 ] Training epoch: 38
[ Wed Sep 28 04:34:58 2022 ] 	Mean training loss: 0.6048. loss2: 0.0000. Mean training acc: 81.10%.
[ Wed Sep 28 04:34:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:34:58 2022 ] Eval epoch: 38
[ Wed Sep 28 04:35:31 2022 ] 	Mean test loss of 296 batches: 0.5516132133433947.
[ Wed Sep 28 04:35:31 2022 ] 	Top1: 82.60%
[ Wed Sep 28 04:35:31 2022 ] 	Top5: 97.59%
[ Wed Sep 28 04:35:31 2022 ] Training epoch: 39
[ Wed Sep 28 04:38:27 2022 ] 	Mean training loss: 0.6037. loss2: 0.0000. Mean training acc: 81.11%.
[ Wed Sep 28 04:38:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:38:27 2022 ] Eval epoch: 39
[ Wed Sep 28 04:39:00 2022 ] 	Mean test loss of 296 batches: 0.5952148024194144.
[ Wed Sep 28 04:39:00 2022 ] 	Top1: 80.90%
[ Wed Sep 28 04:39:00 2022 ] 	Top5: 97.22%
[ Wed Sep 28 04:39:01 2022 ] Training epoch: 40
[ Wed Sep 28 04:41:56 2022 ] 	Mean training loss: 0.6089. loss2: 0.0000. Mean training acc: 80.97%.
[ Wed Sep 28 04:41:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:41:56 2022 ] Eval epoch: 40
[ Wed Sep 28 04:42:29 2022 ] 	Mean test loss of 296 batches: 0.5022594400152967.
[ Wed Sep 28 04:42:30 2022 ] 	Top1: 84.11%
[ Wed Sep 28 04:42:30 2022 ] 	Top5: 97.95%
[ Wed Sep 28 04:42:30 2022 ] Training epoch: 41
[ Wed Sep 28 04:45:26 2022 ] 	Mean training loss: 0.6005. loss2: 0.0000. Mean training acc: 81.36%.
[ Wed Sep 28 04:45:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:45:26 2022 ] Eval epoch: 41
[ Wed Sep 28 04:45:59 2022 ] 	Mean test loss of 296 batches: 0.8052053035715142.
[ Wed Sep 28 04:45:59 2022 ] 	Top1: 75.29%
[ Wed Sep 28 04:45:59 2022 ] 	Top5: 94.66%
[ Wed Sep 28 04:45:59 2022 ] Training epoch: 42
[ Wed Sep 28 04:48:55 2022 ] 	Mean training loss: 0.5924. loss2: 0.0000. Mean training acc: 81.19%.
[ Wed Sep 28 04:48:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:48:55 2022 ] Eval epoch: 42
[ Wed Sep 28 04:49:28 2022 ] 	Mean test loss of 296 batches: 0.631546570944625.
[ Wed Sep 28 04:49:28 2022 ] 	Top1: 80.36%
[ Wed Sep 28 04:49:28 2022 ] 	Top5: 96.93%
[ Wed Sep 28 04:49:28 2022 ] Training epoch: 43
[ Wed Sep 28 04:52:24 2022 ] 	Mean training loss: 0.5989. loss2: 0.0000. Mean training acc: 81.16%.
[ Wed Sep 28 04:52:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:52:24 2022 ] Eval epoch: 43
[ Wed Sep 28 04:52:57 2022 ] 	Mean test loss of 296 batches: 0.9656988402476182.
[ Wed Sep 28 04:52:57 2022 ] 	Top1: 70.00%
[ Wed Sep 28 04:52:57 2022 ] 	Top5: 93.95%
[ Wed Sep 28 04:52:57 2022 ] Training epoch: 44
[ Wed Sep 28 04:55:53 2022 ] 	Mean training loss: 0.5964. loss2: 0.0000. Mean training acc: 81.35%.
[ Wed Sep 28 04:55:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:55:53 2022 ] Eval epoch: 44
[ Wed Sep 28 04:56:26 2022 ] 	Mean test loss of 296 batches: 0.5102506080088584.
[ Wed Sep 28 04:56:26 2022 ] 	Top1: 83.92%
[ Wed Sep 28 04:56:26 2022 ] 	Top5: 97.93%
[ Wed Sep 28 04:56:26 2022 ] Training epoch: 45
[ Wed Sep 28 04:59:22 2022 ] 	Mean training loss: 0.5976. loss2: 0.0000. Mean training acc: 81.32%.
[ Wed Sep 28 04:59:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:59:22 2022 ] Eval epoch: 45
[ Wed Sep 28 04:59:55 2022 ] 	Mean test loss of 296 batches: 0.7132521185621217.
[ Wed Sep 28 04:59:55 2022 ] 	Top1: 78.82%
[ Wed Sep 28 04:59:55 2022 ] 	Top5: 96.11%
[ Wed Sep 28 04:59:55 2022 ] Training epoch: 46
[ Wed Sep 28 05:02:51 2022 ] 	Mean training loss: 0.5847. loss2: 0.0000. Mean training acc: 81.57%.
[ Wed Sep 28 05:02:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:02:51 2022 ] Eval epoch: 46
[ Wed Sep 28 05:03:25 2022 ] 	Mean test loss of 296 batches: 0.4992698964637679.
[ Wed Sep 28 05:03:25 2022 ] 	Top1: 84.34%
[ Wed Sep 28 05:03:25 2022 ] 	Top5: 97.89%
[ Wed Sep 28 05:03:25 2022 ] Training epoch: 47
[ Wed Sep 28 05:06:20 2022 ] 	Mean training loss: 0.5912. loss2: 0.0000. Mean training acc: 81.42%.
[ Wed Sep 28 05:06:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:06:20 2022 ] Eval epoch: 47
[ Wed Sep 28 05:06:53 2022 ] 	Mean test loss of 296 batches: 0.5880059069274245.
[ Wed Sep 28 05:06:54 2022 ] 	Top1: 82.22%
[ Wed Sep 28 05:06:54 2022 ] 	Top5: 97.19%
[ Wed Sep 28 05:06:54 2022 ] Training epoch: 48
[ Wed Sep 28 05:09:49 2022 ] 	Mean training loss: 0.5995. loss2: 0.0000. Mean training acc: 81.28%.
[ Wed Sep 28 05:09:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:09:49 2022 ] Eval epoch: 48
[ Wed Sep 28 05:10:22 2022 ] 	Mean test loss of 296 batches: 0.543270937393646.
[ Wed Sep 28 05:10:23 2022 ] 	Top1: 82.83%
[ Wed Sep 28 05:10:23 2022 ] 	Top5: 97.64%
[ Wed Sep 28 05:10:23 2022 ] Training epoch: 49
[ Wed Sep 28 05:13:19 2022 ] 	Mean training loss: 0.5912. loss2: 0.0000. Mean training acc: 81.48%.
[ Wed Sep 28 05:13:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:13:19 2022 ] Eval epoch: 49
[ Wed Sep 28 05:13:52 2022 ] 	Mean test loss of 296 batches: 0.5641246770282049.
[ Wed Sep 28 05:13:52 2022 ] 	Top1: 81.88%
[ Wed Sep 28 05:13:53 2022 ] 	Top5: 97.68%
[ Wed Sep 28 05:13:53 2022 ] Training epoch: 50
[ Wed Sep 28 05:16:48 2022 ] 	Mean training loss: 0.5892. loss2: 0.0000. Mean training acc: 81.69%.
[ Wed Sep 28 05:16:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:16:48 2022 ] Eval epoch: 50
[ Wed Sep 28 05:17:21 2022 ] 	Mean test loss of 296 batches: 0.5572891673324881.
[ Wed Sep 28 05:17:21 2022 ] 	Top1: 82.66%
[ Wed Sep 28 05:17:22 2022 ] 	Top5: 97.53%
[ Wed Sep 28 05:17:22 2022 ] Training epoch: 51
[ Wed Sep 28 05:20:18 2022 ] 	Mean training loss: 0.5780. loss2: 0.0000. Mean training acc: 82.21%.
[ Wed Sep 28 05:20:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:20:18 2022 ] Eval epoch: 51
[ Wed Sep 28 05:20:51 2022 ] 	Mean test loss of 296 batches: 0.5927392882672516.
[ Wed Sep 28 05:20:51 2022 ] 	Top1: 80.97%
[ Wed Sep 28 05:20:51 2022 ] 	Top5: 97.16%
[ Wed Sep 28 05:20:51 2022 ] Training epoch: 52
[ Wed Sep 28 05:23:47 2022 ] 	Mean training loss: 0.5942. loss2: 0.0000. Mean training acc: 81.36%.
[ Wed Sep 28 05:23:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:23:47 2022 ] Eval epoch: 52
[ Wed Sep 28 05:24:20 2022 ] 	Mean test loss of 296 batches: 0.7190010486220991.
[ Wed Sep 28 05:24:20 2022 ] 	Top1: 78.25%
[ Wed Sep 28 05:24:20 2022 ] 	Top5: 96.15%
[ Wed Sep 28 05:24:20 2022 ] Training epoch: 53
[ Wed Sep 28 05:27:16 2022 ] 	Mean training loss: 0.5856. loss2: 0.0000. Mean training acc: 81.58%.
[ Wed Sep 28 05:27:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:27:16 2022 ] Eval epoch: 53
[ Wed Sep 28 05:27:49 2022 ] 	Mean test loss of 296 batches: 0.5551096325287143.
[ Wed Sep 28 05:27:49 2022 ] 	Top1: 82.71%
[ Wed Sep 28 05:27:49 2022 ] 	Top5: 97.71%
[ Wed Sep 28 05:27:49 2022 ] Training epoch: 54
[ Wed Sep 28 05:30:45 2022 ] 	Mean training loss: 0.5906. loss2: 0.0000. Mean training acc: 81.40%.
[ Wed Sep 28 05:30:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:30:45 2022 ] Eval epoch: 54
[ Wed Sep 28 05:31:18 2022 ] 	Mean test loss of 296 batches: 0.5780266008868411.
[ Wed Sep 28 05:31:18 2022 ] 	Top1: 81.61%
[ Wed Sep 28 05:31:18 2022 ] 	Top5: 97.97%
[ Wed Sep 28 05:31:18 2022 ] Training epoch: 55
[ Wed Sep 28 05:34:14 2022 ] 	Mean training loss: 0.5861. loss2: 0.0000. Mean training acc: 81.54%.
[ Wed Sep 28 05:34:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:34:14 2022 ] Eval epoch: 55
[ Wed Sep 28 05:34:47 2022 ] 	Mean test loss of 296 batches: 0.5548952551608956.
[ Wed Sep 28 05:34:47 2022 ] 	Top1: 82.60%
[ Wed Sep 28 05:34:48 2022 ] 	Top5: 97.64%
[ Wed Sep 28 05:34:48 2022 ] Training epoch: 56
[ Wed Sep 28 05:37:44 2022 ] 	Mean training loss: 0.5799. loss2: 0.0000. Mean training acc: 82.00%.
[ Wed Sep 28 05:37:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:37:44 2022 ] Eval epoch: 56
[ Wed Sep 28 05:38:17 2022 ] 	Mean test loss of 296 batches: 0.5750925205647945.
[ Wed Sep 28 05:38:17 2022 ] 	Top1: 81.84%
[ Wed Sep 28 05:38:17 2022 ] 	Top5: 97.27%
[ Wed Sep 28 05:38:17 2022 ] Training epoch: 57
[ Wed Sep 28 05:41:13 2022 ] 	Mean training loss: 0.5753. loss2: 0.0000. Mean training acc: 82.03%.
[ Wed Sep 28 05:41:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:41:13 2022 ] Eval epoch: 57
[ Wed Sep 28 05:41:46 2022 ] 	Mean test loss of 296 batches: 1.0042929708756305.
[ Wed Sep 28 05:41:46 2022 ] 	Top1: 70.23%
[ Wed Sep 28 05:41:46 2022 ] 	Top5: 93.48%
[ Wed Sep 28 05:41:46 2022 ] Training epoch: 58
[ Wed Sep 28 05:44:42 2022 ] 	Mean training loss: 0.5836. loss2: 0.0000. Mean training acc: 81.53%.
[ Wed Sep 28 05:44:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:44:42 2022 ] Eval epoch: 58
[ Wed Sep 28 05:45:15 2022 ] 	Mean test loss of 296 batches: 0.5127533757304018.
[ Wed Sep 28 05:45:15 2022 ] 	Top1: 84.25%
[ Wed Sep 28 05:45:15 2022 ] 	Top5: 97.88%
[ Wed Sep 28 05:45:15 2022 ] Training epoch: 59
[ Wed Sep 28 05:48:12 2022 ] 	Mean training loss: 0.5796. loss2: 0.0000. Mean training acc: 81.84%.
[ Wed Sep 28 05:48:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:48:12 2022 ] Eval epoch: 59
[ Wed Sep 28 05:48:45 2022 ] 	Mean test loss of 296 batches: 0.6150324321981218.
[ Wed Sep 28 05:48:45 2022 ] 	Top1: 81.48%
[ Wed Sep 28 05:48:45 2022 ] 	Top5: 96.84%
[ Wed Sep 28 05:48:45 2022 ] Training epoch: 60
[ Wed Sep 28 05:51:41 2022 ] 	Mean training loss: 0.5791. loss2: 0.0000. Mean training acc: 81.86%.
[ Wed Sep 28 05:51:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:51:41 2022 ] Eval epoch: 60
[ Wed Sep 28 05:52:14 2022 ] 	Mean test loss of 296 batches: 0.6017228405721284.
[ Wed Sep 28 05:52:14 2022 ] 	Top1: 80.40%
[ Wed Sep 28 05:52:14 2022 ] 	Top5: 97.62%
[ Wed Sep 28 05:52:14 2022 ] Training epoch: 61
[ Wed Sep 28 05:55:11 2022 ] 	Mean training loss: 0.5743. loss2: 0.0000. Mean training acc: 82.02%.
[ Wed Sep 28 05:55:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:55:11 2022 ] Eval epoch: 61
[ Wed Sep 28 05:55:44 2022 ] 	Mean test loss of 296 batches: 0.53947125722629.
[ Wed Sep 28 05:55:44 2022 ] 	Top1: 82.63%
[ Wed Sep 28 05:55:44 2022 ] 	Top5: 97.72%
[ Wed Sep 28 05:55:44 2022 ] Training epoch: 62
[ Wed Sep 28 05:58:40 2022 ] 	Mean training loss: 0.5761. loss2: 0.0000. Mean training acc: 81.82%.
[ Wed Sep 28 05:58:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:58:40 2022 ] Eval epoch: 62
[ Wed Sep 28 05:59:14 2022 ] 	Mean test loss of 296 batches: 0.49161107878427246.
[ Wed Sep 28 05:59:14 2022 ] 	Top1: 84.45%
[ Wed Sep 28 05:59:14 2022 ] 	Top5: 97.96%
[ Wed Sep 28 05:59:14 2022 ] Training epoch: 63
[ Wed Sep 28 06:02:10 2022 ] 	Mean training loss: 0.5737. loss2: 0.0000. Mean training acc: 82.08%.
[ Wed Sep 28 06:02:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:02:10 2022 ] Eval epoch: 63
[ Wed Sep 28 06:02:43 2022 ] 	Mean test loss of 296 batches: 0.4979069246784658.
[ Wed Sep 28 06:02:43 2022 ] 	Top1: 84.24%
[ Wed Sep 28 06:02:43 2022 ] 	Top5: 97.68%
[ Wed Sep 28 06:02:43 2022 ] Training epoch: 64
[ Wed Sep 28 06:05:39 2022 ] 	Mean training loss: 0.5715. loss2: 0.0000. Mean training acc: 82.17%.
[ Wed Sep 28 06:05:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:05:39 2022 ] Eval epoch: 64
[ Wed Sep 28 06:06:12 2022 ] 	Mean test loss of 296 batches: 0.5534244071490861.
[ Wed Sep 28 06:06:12 2022 ] 	Top1: 82.91%
[ Wed Sep 28 06:06:12 2022 ] 	Top5: 97.57%
[ Wed Sep 28 06:06:12 2022 ] Training epoch: 65
[ Wed Sep 28 06:09:08 2022 ] 	Mean training loss: 0.5728. loss2: 0.0000. Mean training acc: 82.20%.
[ Wed Sep 28 06:09:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:09:08 2022 ] Eval epoch: 65
[ Wed Sep 28 06:09:41 2022 ] 	Mean test loss of 296 batches: 0.5414178879881227.
[ Wed Sep 28 06:09:41 2022 ] 	Top1: 82.66%
[ Wed Sep 28 06:09:41 2022 ] 	Top5: 97.67%
[ Wed Sep 28 06:09:41 2022 ] Training epoch: 66
[ Wed Sep 28 06:12:37 2022 ] 	Mean training loss: 0.5779. loss2: 0.0000. Mean training acc: 82.07%.
[ Wed Sep 28 06:12:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:12:37 2022 ] Eval epoch: 66
[ Wed Sep 28 06:13:10 2022 ] 	Mean test loss of 296 batches: 0.5216865145878212.
[ Wed Sep 28 06:13:10 2022 ] 	Top1: 83.50%
[ Wed Sep 28 06:13:10 2022 ] 	Top5: 98.02%
[ Wed Sep 28 06:13:10 2022 ] Training epoch: 67
[ Wed Sep 28 06:16:06 2022 ] 	Mean training loss: 0.5830. loss2: 0.0000. Mean training acc: 81.72%.
[ Wed Sep 28 06:16:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:16:06 2022 ] Eval epoch: 67
[ Wed Sep 28 06:16:39 2022 ] 	Mean test loss of 296 batches: 0.5258847859159514.
[ Wed Sep 28 06:16:39 2022 ] 	Top1: 83.35%
[ Wed Sep 28 06:16:39 2022 ] 	Top5: 97.71%
[ Wed Sep 28 06:16:39 2022 ] Training epoch: 68
[ Wed Sep 28 06:19:35 2022 ] 	Mean training loss: 0.5753. loss2: 0.0000. Mean training acc: 81.96%.
[ Wed Sep 28 06:19:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:19:35 2022 ] Eval epoch: 68
[ Wed Sep 28 06:20:08 2022 ] 	Mean test loss of 296 batches: 0.6114265589999992.
[ Wed Sep 28 06:20:08 2022 ] 	Top1: 81.28%
[ Wed Sep 28 06:20:08 2022 ] 	Top5: 97.00%
[ Wed Sep 28 06:20:08 2022 ] Training epoch: 69
[ Wed Sep 28 06:23:04 2022 ] 	Mean training loss: 0.5751. loss2: 0.0000. Mean training acc: 82.02%.
[ Wed Sep 28 06:23:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:23:04 2022 ] Eval epoch: 69
[ Wed Sep 28 06:23:37 2022 ] 	Mean test loss of 296 batches: 0.5138386175841898.
[ Wed Sep 28 06:23:37 2022 ] 	Top1: 84.32%
[ Wed Sep 28 06:23:38 2022 ] 	Top5: 97.74%
[ Wed Sep 28 06:23:38 2022 ] Training epoch: 70
[ Wed Sep 28 06:26:33 2022 ] 	Mean training loss: 0.5679. loss2: 0.0000. Mean training acc: 82.23%.
[ Wed Sep 28 06:26:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:26:33 2022 ] Eval epoch: 70
[ Wed Sep 28 06:27:07 2022 ] 	Mean test loss of 296 batches: 0.5206541245853579.
[ Wed Sep 28 06:27:07 2022 ] 	Top1: 83.34%
[ Wed Sep 28 06:27:07 2022 ] 	Top5: 97.92%
[ Wed Sep 28 06:27:07 2022 ] Training epoch: 71
[ Wed Sep 28 06:30:03 2022 ] 	Mean training loss: 0.5727. loss2: 0.0000. Mean training acc: 82.02%.
[ Wed Sep 28 06:30:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:30:03 2022 ] Eval epoch: 71
[ Wed Sep 28 06:30:37 2022 ] 	Mean test loss of 296 batches: 0.5458160746540572.
[ Wed Sep 28 06:30:37 2022 ] 	Top1: 83.32%
[ Wed Sep 28 06:30:37 2022 ] 	Top5: 97.86%
[ Wed Sep 28 06:30:37 2022 ] Training epoch: 72
[ Wed Sep 28 06:33:33 2022 ] 	Mean training loss: 0.5759. loss2: 0.0000. Mean training acc: 82.04%.
[ Wed Sep 28 06:33:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:33:33 2022 ] Eval epoch: 72
[ Wed Sep 28 06:34:06 2022 ] 	Mean test loss of 296 batches: 0.552530696162501.
[ Wed Sep 28 06:34:06 2022 ] 	Top1: 82.44%
[ Wed Sep 28 06:34:06 2022 ] 	Top5: 97.91%
[ Wed Sep 28 06:34:06 2022 ] Training epoch: 73
[ Wed Sep 28 06:37:02 2022 ] 	Mean training loss: 0.5729. loss2: 0.0000. Mean training acc: 82.03%.
[ Wed Sep 28 06:37:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:37:02 2022 ] Eval epoch: 73
[ Wed Sep 28 06:37:35 2022 ] 	Mean test loss of 296 batches: 0.569803856011178.
[ Wed Sep 28 06:37:36 2022 ] 	Top1: 82.10%
[ Wed Sep 28 06:37:36 2022 ] 	Top5: 97.95%
[ Wed Sep 28 06:37:36 2022 ] Training epoch: 74
[ Wed Sep 28 06:40:31 2022 ] 	Mean training loss: 0.5740. loss2: 0.0000. Mean training acc: 82.19%.
[ Wed Sep 28 06:40:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:40:31 2022 ] Eval epoch: 74
[ Wed Sep 28 06:41:05 2022 ] 	Mean test loss of 296 batches: 0.5249828450583123.
[ Wed Sep 28 06:41:05 2022 ] 	Top1: 83.43%
[ Wed Sep 28 06:41:05 2022 ] 	Top5: 97.59%
[ Wed Sep 28 06:41:05 2022 ] Training epoch: 75
[ Wed Sep 28 06:44:00 2022 ] 	Mean training loss: 0.5638. loss2: 0.0000. Mean training acc: 82.32%.
[ Wed Sep 28 06:44:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:44:01 2022 ] Eval epoch: 75
[ Wed Sep 28 06:44:34 2022 ] 	Mean test loss of 296 batches: 0.469929214365579.
[ Wed Sep 28 06:44:34 2022 ] 	Top1: 85.47%
[ Wed Sep 28 06:44:34 2022 ] 	Top5: 98.15%
[ Wed Sep 28 06:44:34 2022 ] Training epoch: 76
[ Wed Sep 28 06:47:29 2022 ] 	Mean training loss: 0.5784. loss2: 0.0000. Mean training acc: 82.03%.
[ Wed Sep 28 06:47:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:47:29 2022 ] Eval epoch: 76
[ Wed Sep 28 06:48:02 2022 ] 	Mean test loss of 296 batches: 0.5364730654133333.
[ Wed Sep 28 06:48:03 2022 ] 	Top1: 82.46%
[ Wed Sep 28 06:48:03 2022 ] 	Top5: 97.90%
[ Wed Sep 28 06:48:03 2022 ] Training epoch: 77
[ Wed Sep 28 06:50:59 2022 ] 	Mean training loss: 0.5669. loss2: 0.0000. Mean training acc: 82.21%.
[ Wed Sep 28 06:50:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:50:59 2022 ] Eval epoch: 77
[ Wed Sep 28 06:51:32 2022 ] 	Mean test loss of 296 batches: 0.6394433090211572.
[ Wed Sep 28 06:51:32 2022 ] 	Top1: 80.79%
[ Wed Sep 28 06:51:32 2022 ] 	Top5: 96.88%
[ Wed Sep 28 06:51:32 2022 ] Training epoch: 78
[ Wed Sep 28 06:54:29 2022 ] 	Mean training loss: 0.5712. loss2: 0.0000. Mean training acc: 82.33%.
[ Wed Sep 28 06:54:29 2022 ] 	Time consumption: [Data]02%, [Network]97%
[ Wed Sep 28 06:54:29 2022 ] Eval epoch: 78
[ Wed Sep 28 06:55:02 2022 ] 	Mean test loss of 296 batches: 0.4913474573577578.
[ Wed Sep 28 06:55:02 2022 ] 	Top1: 84.59%
[ Wed Sep 28 06:55:03 2022 ] 	Top5: 97.96%
[ Wed Sep 28 06:55:03 2022 ] Training epoch: 79
[ Wed Sep 28 06:57:58 2022 ] 	Mean training loss: 0.5692. loss2: 0.0000. Mean training acc: 82.03%.
[ Wed Sep 28 06:57:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:57:58 2022 ] Eval epoch: 79
[ Wed Sep 28 06:58:31 2022 ] 	Mean test loss of 296 batches: 0.6205252492045229.
[ Wed Sep 28 06:58:31 2022 ] 	Top1: 80.97%
[ Wed Sep 28 06:58:32 2022 ] 	Top5: 97.93%
[ Wed Sep 28 06:58:32 2022 ] Training epoch: 80
[ Wed Sep 28 07:01:27 2022 ] 	Mean training loss: 0.5732. loss2: 0.0000. Mean training acc: 82.11%.
[ Wed Sep 28 07:01:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:01:27 2022 ] Eval epoch: 80
[ Wed Sep 28 07:02:00 2022 ] 	Mean test loss of 296 batches: 0.5928482293981958.
[ Wed Sep 28 07:02:00 2022 ] 	Top1: 82.11%
[ Wed Sep 28 07:02:01 2022 ] 	Top5: 96.84%
[ Wed Sep 28 07:02:01 2022 ] Training epoch: 81
[ Wed Sep 28 07:04:56 2022 ] 	Mean training loss: 0.5593. loss2: 0.0000. Mean training acc: 82.24%.
[ Wed Sep 28 07:04:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:04:56 2022 ] Eval epoch: 81
[ Wed Sep 28 07:05:29 2022 ] 	Mean test loss of 296 batches: 0.5317304251665199.
[ Wed Sep 28 07:05:30 2022 ] 	Top1: 83.68%
[ Wed Sep 28 07:05:30 2022 ] 	Top5: 97.68%
[ Wed Sep 28 07:05:30 2022 ] Training epoch: 82
[ Wed Sep 28 07:08:26 2022 ] 	Mean training loss: 0.5648. loss2: 0.0000. Mean training acc: 82.19%.
[ Wed Sep 28 07:08:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:08:26 2022 ] Eval epoch: 82
[ Wed Sep 28 07:08:59 2022 ] 	Mean test loss of 296 batches: 0.5273484931000181.
[ Wed Sep 28 07:08:59 2022 ] 	Top1: 82.84%
[ Wed Sep 28 07:08:59 2022 ] 	Top5: 98.12%
[ Wed Sep 28 07:08:59 2022 ] Training epoch: 83
[ Wed Sep 28 07:11:55 2022 ] 	Mean training loss: 0.5661. loss2: 0.0000. Mean training acc: 82.22%.
[ Wed Sep 28 07:11:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:11:55 2022 ] Eval epoch: 83
[ Wed Sep 28 07:12:28 2022 ] 	Mean test loss of 296 batches: 0.5565441941993462.
[ Wed Sep 28 07:12:28 2022 ] 	Top1: 82.52%
[ Wed Sep 28 07:12:28 2022 ] 	Top5: 97.77%
[ Wed Sep 28 07:12:28 2022 ] Training epoch: 84
[ Wed Sep 28 07:15:24 2022 ] 	Mean training loss: 0.5718. loss2: 0.0000. Mean training acc: 82.33%.
[ Wed Sep 28 07:15:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:15:24 2022 ] Eval epoch: 84
[ Wed Sep 28 07:15:57 2022 ] 	Mean test loss of 296 batches: 0.7448868253142447.
[ Wed Sep 28 07:15:57 2022 ] 	Top1: 78.19%
[ Wed Sep 28 07:15:57 2022 ] 	Top5: 95.29%
[ Wed Sep 28 07:15:57 2022 ] Training epoch: 85
[ Wed Sep 28 07:18:53 2022 ] 	Mean training loss: 0.5659. loss2: 0.0000. Mean training acc: 82.28%.
[ Wed Sep 28 07:18:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:18:53 2022 ] Eval epoch: 85
[ Wed Sep 28 07:19:26 2022 ] 	Mean test loss of 296 batches: 0.5440209309092244.
[ Wed Sep 28 07:19:26 2022 ] 	Top1: 83.09%
[ Wed Sep 28 07:19:26 2022 ] 	Top5: 97.55%
[ Wed Sep 28 07:19:26 2022 ] Training epoch: 86
[ Wed Sep 28 07:22:22 2022 ] 	Mean training loss: 0.5650. loss2: 0.0000. Mean training acc: 82.33%.
[ Wed Sep 28 07:22:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:22:22 2022 ] Eval epoch: 86
[ Wed Sep 28 07:22:55 2022 ] 	Mean test loss of 296 batches: 0.4772212727851159.
[ Wed Sep 28 07:22:55 2022 ] 	Top1: 84.97%
[ Wed Sep 28 07:22:55 2022 ] 	Top5: 98.18%
[ Wed Sep 28 07:22:55 2022 ] Training epoch: 87
[ Wed Sep 28 07:25:51 2022 ] 	Mean training loss: 0.5671. loss2: 0.0000. Mean training acc: 82.36%.
[ Wed Sep 28 07:25:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:25:51 2022 ] Eval epoch: 87
[ Wed Sep 28 07:26:24 2022 ] 	Mean test loss of 296 batches: 0.6191518307739013.
[ Wed Sep 28 07:26:24 2022 ] 	Top1: 80.49%
[ Wed Sep 28 07:26:24 2022 ] 	Top5: 96.95%
[ Wed Sep 28 07:26:24 2022 ] Training epoch: 88
[ Wed Sep 28 07:29:20 2022 ] 	Mean training loss: 0.5673. loss2: 0.0000. Mean training acc: 82.38%.
[ Wed Sep 28 07:29:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:29:20 2022 ] Eval epoch: 88
[ Wed Sep 28 07:29:53 2022 ] 	Mean test loss of 296 batches: 0.46594372185299526.
[ Wed Sep 28 07:29:53 2022 ] 	Top1: 84.99%
[ Wed Sep 28 07:29:53 2022 ] 	Top5: 98.29%
[ Wed Sep 28 07:29:53 2022 ] Training epoch: 89
[ Wed Sep 28 07:32:49 2022 ] 	Mean training loss: 0.5617. loss2: 0.0000. Mean training acc: 82.46%.
[ Wed Sep 28 07:32:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:32:49 2022 ] Eval epoch: 89
[ Wed Sep 28 07:33:22 2022 ] 	Mean test loss of 296 batches: 0.5774189530292878.
[ Wed Sep 28 07:33:22 2022 ] 	Top1: 81.86%
[ Wed Sep 28 07:33:22 2022 ] 	Top5: 97.37%
[ Wed Sep 28 07:33:22 2022 ] Training epoch: 90
[ Wed Sep 28 07:36:18 2022 ] 	Mean training loss: 0.5612. loss2: 0.0000. Mean training acc: 82.50%.
[ Wed Sep 28 07:36:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:36:18 2022 ] Eval epoch: 90
[ Wed Sep 28 07:36:51 2022 ] 	Mean test loss of 296 batches: 0.4878024691464128.
[ Wed Sep 28 07:36:51 2022 ] 	Top1: 84.25%
[ Wed Sep 28 07:36:52 2022 ] 	Top5: 98.18%
[ Wed Sep 28 07:36:52 2022 ] Training epoch: 91
[ Wed Sep 28 07:39:47 2022 ] 	Mean training loss: 0.3480. loss2: 0.0000. Mean training acc: 89.24%.
[ Wed Sep 28 07:39:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:39:47 2022 ] Eval epoch: 91
[ Wed Sep 28 07:40:21 2022 ] 	Mean test loss of 296 batches: 0.2553734373288682.
[ Wed Sep 28 07:40:21 2022 ] 	Top1: 91.84%
[ Wed Sep 28 07:40:21 2022 ] 	Top5: 99.17%
[ Wed Sep 28 07:40:21 2022 ] Training epoch: 92
[ Wed Sep 28 07:43:16 2022 ] 	Mean training loss: 0.2813. loss2: 0.0000. Mean training acc: 91.25%.
[ Wed Sep 28 07:43:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:43:16 2022 ] Eval epoch: 92
[ Wed Sep 28 07:43:49 2022 ] 	Mean test loss of 296 batches: 0.24776405532768853.
[ Wed Sep 28 07:43:49 2022 ] 	Top1: 92.05%
[ Wed Sep 28 07:43:49 2022 ] 	Top5: 99.20%
[ Wed Sep 28 07:43:49 2022 ] Training epoch: 93
[ Wed Sep 28 07:46:45 2022 ] 	Mean training loss: 0.2559. loss2: 0.0000. Mean training acc: 92.09%.
[ Wed Sep 28 07:46:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:46:45 2022 ] Eval epoch: 93
[ Wed Sep 28 07:47:18 2022 ] 	Mean test loss of 296 batches: 0.23624259538671658.
[ Wed Sep 28 07:47:18 2022 ] 	Top1: 92.45%
[ Wed Sep 28 07:47:18 2022 ] 	Top5: 99.16%
[ Wed Sep 28 07:47:18 2022 ] Training epoch: 94
[ Wed Sep 28 07:50:14 2022 ] 	Mean training loss: 0.2355. loss2: 0.0000. Mean training acc: 92.88%.
[ Wed Sep 28 07:50:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:50:15 2022 ] Eval epoch: 94
[ Wed Sep 28 07:50:48 2022 ] 	Mean test loss of 296 batches: 0.2398821788563116.
[ Wed Sep 28 07:50:48 2022 ] 	Top1: 92.37%
[ Wed Sep 28 07:50:48 2022 ] 	Top5: 99.24%
[ Wed Sep 28 07:50:48 2022 ] Training epoch: 95
[ Wed Sep 28 07:53:44 2022 ] 	Mean training loss: 0.2198. loss2: 0.0000. Mean training acc: 93.41%.
[ Wed Sep 28 07:53:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:53:44 2022 ] Eval epoch: 95
[ Wed Sep 28 07:54:17 2022 ] 	Mean test loss of 296 batches: 0.2349379403718017.
[ Wed Sep 28 07:54:18 2022 ] 	Top1: 92.44%
[ Wed Sep 28 07:54:18 2022 ] 	Top5: 99.29%
[ Wed Sep 28 07:54:18 2022 ] Training epoch: 96
[ Wed Sep 28 07:57:14 2022 ] 	Mean training loss: 0.2045. loss2: 0.0000. Mean training acc: 93.85%.
[ Wed Sep 28 07:57:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:57:14 2022 ] Eval epoch: 96
[ Wed Sep 28 07:57:47 2022 ] 	Mean test loss of 296 batches: 0.24191647112319195.
[ Wed Sep 28 07:57:47 2022 ] 	Top1: 92.30%
[ Wed Sep 28 07:57:47 2022 ] 	Top5: 99.19%
[ Wed Sep 28 07:57:47 2022 ] Training epoch: 97
[ Wed Sep 28 08:00:43 2022 ] 	Mean training loss: 0.1941. loss2: 0.0000. Mean training acc: 94.25%.
[ Wed Sep 28 08:00:43 2022 ] 	Time consumption: [Data]02%, [Network]97%
[ Wed Sep 28 08:00:44 2022 ] Eval epoch: 97
[ Wed Sep 28 08:01:17 2022 ] 	Mean test loss of 296 batches: 0.24739294038531748.
[ Wed Sep 28 08:01:17 2022 ] 	Top1: 92.20%
[ Wed Sep 28 08:01:17 2022 ] 	Top5: 99.16%
[ Wed Sep 28 08:01:17 2022 ] Training epoch: 98
[ Wed Sep 28 08:04:15 2022 ] 	Mean training loss: 0.1817. loss2: 0.0000. Mean training acc: 94.60%.
[ Wed Sep 28 08:04:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:04:15 2022 ] Eval epoch: 98
[ Wed Sep 28 08:04:48 2022 ] 	Mean test loss of 296 batches: 0.23977903603274073.
[ Wed Sep 28 08:04:48 2022 ] 	Top1: 92.40%
[ Wed Sep 28 08:04:49 2022 ] 	Top5: 99.24%
[ Wed Sep 28 08:04:49 2022 ] Training epoch: 99
[ Wed Sep 28 08:07:44 2022 ] 	Mean training loss: 0.1717. loss2: 0.0000. Mean training acc: 94.99%.
[ Wed Sep 28 08:07:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:07:44 2022 ] Eval epoch: 99
[ Wed Sep 28 08:08:17 2022 ] 	Mean test loss of 296 batches: 0.24803300677622492.
[ Wed Sep 28 08:08:17 2022 ] 	Top1: 92.19%
[ Wed Sep 28 08:08:18 2022 ] 	Top5: 99.25%
[ Wed Sep 28 08:08:18 2022 ] Training epoch: 100
[ Wed Sep 28 08:11:13 2022 ] 	Mean training loss: 0.1666. loss2: 0.0000. Mean training acc: 95.11%.
[ Wed Sep 28 08:11:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:11:13 2022 ] Eval epoch: 100
[ Wed Sep 28 08:11:47 2022 ] 	Mean test loss of 296 batches: 0.250569213722908.
[ Wed Sep 28 08:11:47 2022 ] 	Top1: 92.18%
[ Wed Sep 28 08:11:47 2022 ] 	Top5: 99.17%
[ Wed Sep 28 08:11:47 2022 ] Training epoch: 101
[ Wed Sep 28 08:14:42 2022 ] 	Mean training loss: 0.1370. loss2: 0.0000. Mean training acc: 96.30%.
[ Wed Sep 28 08:14:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:14:42 2022 ] Eval epoch: 101
[ Wed Sep 28 08:15:16 2022 ] 	Mean test loss of 296 batches: 0.23389365474341084.
[ Wed Sep 28 08:15:16 2022 ] 	Top1: 92.64%
[ Wed Sep 28 08:15:16 2022 ] 	Top5: 99.23%
[ Wed Sep 28 08:15:16 2022 ] Training epoch: 102
[ Wed Sep 28 08:18:11 2022 ] 	Mean training loss: 0.1247. loss2: 0.0000. Mean training acc: 96.56%.
[ Wed Sep 28 08:18:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:18:12 2022 ] Eval epoch: 102
[ Wed Sep 28 08:18:45 2022 ] 	Mean test loss of 296 batches: 0.23281940034389295.
[ Wed Sep 28 08:18:45 2022 ] 	Top1: 92.66%
[ Wed Sep 28 08:18:45 2022 ] 	Top5: 99.22%
[ Wed Sep 28 08:18:45 2022 ] Training epoch: 103
[ Wed Sep 28 08:21:41 2022 ] 	Mean training loss: 0.1191. loss2: 0.0000. Mean training acc: 96.88%.
[ Wed Sep 28 08:21:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:21:41 2022 ] Eval epoch: 103
[ Wed Sep 28 08:22:14 2022 ] 	Mean test loss of 296 batches: 0.2283767740607161.
[ Wed Sep 28 08:22:14 2022 ] 	Top1: 92.87%
[ Wed Sep 28 08:22:14 2022 ] 	Top5: 99.25%
[ Wed Sep 28 08:22:14 2022 ] Training epoch: 104
[ Wed Sep 28 08:25:10 2022 ] 	Mean training loss: 0.1124. loss2: 0.0000. Mean training acc: 96.99%.
[ Wed Sep 28 08:25:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:25:10 2022 ] Eval epoch: 104
[ Wed Sep 28 08:25:43 2022 ] 	Mean test loss of 296 batches: 0.23530981390158068.
[ Wed Sep 28 08:25:43 2022 ] 	Top1: 92.81%
[ Wed Sep 28 08:25:43 2022 ] 	Top5: 99.19%
[ Wed Sep 28 08:25:43 2022 ] Training epoch: 105
[ Wed Sep 28 08:28:39 2022 ] 	Mean training loss: 0.1098. loss2: 0.0000. Mean training acc: 97.12%.
[ Wed Sep 28 08:28:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:28:39 2022 ] Eval epoch: 105
[ Wed Sep 28 08:29:13 2022 ] 	Mean test loss of 296 batches: 0.2356087385639045.
[ Wed Sep 28 08:29:13 2022 ] 	Top1: 92.75%
[ Wed Sep 28 08:29:13 2022 ] 	Top5: 99.20%
[ Wed Sep 28 08:29:13 2022 ] Training epoch: 106
[ Wed Sep 28 08:32:09 2022 ] 	Mean training loss: 0.1074. loss2: 0.0000. Mean training acc: 97.19%.
[ Wed Sep 28 08:32:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:32:09 2022 ] Eval epoch: 106
[ Wed Sep 28 08:32:42 2022 ] 	Mean test loss of 296 batches: 0.233956929478744.
[ Wed Sep 28 08:32:42 2022 ] 	Top1: 92.67%
[ Wed Sep 28 08:32:42 2022 ] 	Top5: 99.26%
[ Wed Sep 28 08:32:42 2022 ] Training epoch: 107
[ Wed Sep 28 08:35:38 2022 ] 	Mean training loss: 0.1034. loss2: 0.0000. Mean training acc: 97.34%.
[ Wed Sep 28 08:35:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:35:38 2022 ] Eval epoch: 107
[ Wed Sep 28 08:36:11 2022 ] 	Mean test loss of 296 batches: 0.24662371003068984.
[ Wed Sep 28 08:36:11 2022 ] 	Top1: 92.49%
[ Wed Sep 28 08:36:11 2022 ] 	Top5: 99.18%
[ Wed Sep 28 08:36:11 2022 ] Training epoch: 108
[ Wed Sep 28 08:39:07 2022 ] 	Mean training loss: 0.1035. loss2: 0.0000. Mean training acc: 97.31%.
[ Wed Sep 28 08:39:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:39:07 2022 ] Eval epoch: 108
[ Wed Sep 28 08:39:40 2022 ] 	Mean test loss of 296 batches: 0.23577094606652454.
[ Wed Sep 28 08:39:40 2022 ] 	Top1: 92.81%
[ Wed Sep 28 08:39:40 2022 ] 	Top5: 99.26%
[ Wed Sep 28 08:39:40 2022 ] Training epoch: 109
[ Wed Sep 28 08:42:36 2022 ] 	Mean training loss: 0.1032. loss2: 0.0000. Mean training acc: 97.41%.
[ Wed Sep 28 08:42:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:42:36 2022 ] Eval epoch: 109
[ Wed Sep 28 08:43:09 2022 ] 	Mean test loss of 296 batches: 0.2310425741849719.
[ Wed Sep 28 08:43:09 2022 ] 	Top1: 92.91%
[ Wed Sep 28 08:43:10 2022 ] 	Top5: 99.22%
[ Wed Sep 28 08:43:10 2022 ] Training epoch: 110
[ Wed Sep 28 08:46:05 2022 ] 	Mean training loss: 0.0965. loss2: 0.0000. Mean training acc: 97.63%.
[ Wed Sep 28 08:46:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:46:05 2022 ] Eval epoch: 110
[ Wed Sep 28 08:46:38 2022 ] 	Mean test loss of 296 batches: 0.2441387559111054.
[ Wed Sep 28 08:46:39 2022 ] 	Top1: 92.59%
[ Wed Sep 28 08:46:39 2022 ] 	Top5: 99.20%
[ Wed Sep 28 08:47:12 2022 ] Best accuracy: 0.9291147263891824
[ Wed Sep 28 08:47:12 2022 ] Epoch number: 109
[ Wed Sep 28 08:47:12 2022 ] Model name: work_dir/ntu60/cview/fc_vel
[ Wed Sep 28 08:47:12 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:47:12 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:47:12 2022 ] Base LR: 0.1
[ Wed Sep 28 08:47:12 2022 ] Batch Size: 64
[ Wed Sep 28 08:47:12 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:47:12 2022 ] seed: 1
